Fast Approximate Multioutput Gaussian Processes

Abstract

Gaussian process regression models are an appealing machine learning method, as they learn expressive nonlinear models from exemplar data with minimal parameter tuning and estimate both the mean and covariance of unseen points. However, the cubic growth of computational complexity with the number of samples has been a long-standing challenge. Training requires the inversion of an $N \times N$ kernel matrix at every iteration, whereas regression needs the computation of an $m \times N$ kernel matrix, where $N$ and $m$ are the numbers of training and test points, respectively. This work demonstrates how approximating the kernel using its eigenvalues and eigenfunctions leads to an approximate Gaussian process with a significant reduction in complexity. Training now requires computing only an $N \times n$ eigenfunction matrix and an $n \times n$ inverse, where $n$ is the number of selected eigenvalues. Furthermore, regression now needs only an $m \times n$ matrix. In a special case, hyperparameter optimization is completely independent of the number of training samples. The proposed approach can regress over multiple outputs, learn the correlations between them, and estimate their derivatives of any order. The complexity reduction, regression capabilities, multioutput correlation learning, and a comparison with the state of the art are demonstrated in simulation examples. Finally, we show how the approach can be utilized to model real human data.
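To make the claimed complexity reduction concrete, here is a minimal sketch of eigenfunction-based approximate GP regression in the Nyström style; the paper's exact eigenfunction construction may differ, and the function names, the landmark points `Z`, and the toy data below are illustrative assumptions. Training touches only an $N \times n$ eigenfunction matrix and an $n \times n$ solve, and prediction only an $m \times n$ matrix.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between rows of A and rows of B."""
    d2 = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def fit_predict(X, y, Xstar, Z, noise=1e-2):
    """Approximate GP regression with n << N eigenfunctions (Nystrom-style).

    Z: n x d landmark points; the eigendecomposition of their n x n kernel
    yields approximate eigenfunctions evaluated via the N x n cross-kernel.
    Only an N x n feature matrix and an n x n solve are ever formed.
    """
    n = Z.shape[0]
    lam, U = np.linalg.eigh(rbf_kernel(Z, Z))       # n eigenpairs
    lam = np.clip(lam, 1e-12, None)                 # guard tiny eigenvalues
    Phi = rbf_kernel(X, Z) @ U / np.sqrt(lam)       # N x n eigenfunction matrix
    A = Phi.T @ Phi + noise * np.eye(n)             # n x n system
    w = np.linalg.solve(A, Phi.T @ y)               # posterior weight mean
    Phis = rbf_kernel(Xstar, Z) @ U / np.sqrt(lam)  # m x n test features
    mean = Phis @ w
    Sw = noise * np.linalg.inv(A)                   # posterior weight covariance
    var = ((Phis @ Sw) * Phis).sum(1) + noise       # predictive variance
    return mean, var

# Toy usage: N = 2000 training samples but only n = 20 basis functions.
X = np.linspace(0, 10, 2000)[:, None]
y = np.sin(X[:, 0]) + 0.1 * np.random.randn(2000)
Z = np.linspace(0, 10, 20)[:, None]
mean, var = fit_predict(X, y, np.array([[5.0]]), Z)
```

With $n$ fixed, rebuilding the $n \times n$ system after a hyperparameter change costs $O(Nn^2)$ rather than the $O(N^3)$ of a full kernel inversion, which is where the speedup the abstract describes comes from.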


Related articles

Efficient Multioutput Gaussian Processes through Variational Inducing Kernels

Interest in multioutput kernel methods is increasing, whether under the guise of multitask learning, multisensor networks or structured output data. From the Gaussian process perspective a multioutput Mercer kernel is a covariance function over correlated output functions. One way of constructing such kernels is based on convolution processes (CP). A key problem for this approach is efficient i...

Full text
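As a sketch of the convolution-process (CP) construction mentioned in the abstract above, following the standard formulation rather than necessarily this paper's exact variant: each output $f_d(x)$ is obtained by convolving a shared latent process $u(z)$, with covariance $k_u$, against an output-specific smoothing kernel $G_d$:

$$f_d(x) = \int G_d(x - z)\, u(z)\, dz, \qquad \operatorname{cov}\!\big(f_d(x), f_{d'}(x')\big) = \iint G_d(x - z)\, G_{d'}(x' - z')\, k_u(z, z')\, dz\, dz'.$$

The resulting cross-covariances form a valid multioutput Mercer kernel because positive semidefiniteness is inherited from $k_u$.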

Fast hierarchical Gaussian processes

While the framework of Gaussian process priors for functions is very flexible and has a number of advantages, its use within a fully Bayesian hierarchical modeling framework has been limited due to computational constraints. Most often, simple models are fit, with hyperparameters learned by maximum likelihood. But this approach understates the posterior uncertainty in inference. We consider pri...

Full text

Fast Kronecker Inference in Gaussian Processes with non-Gaussian Likelihoods

Gaussian processes (GPs) are a flexible class of methods with state of the art performance on spatial statistics applications. However, GPs require O(n³) computations and O(n²) storage, and popular GP kernels are typically limited to smoothing and interpolation. To address these difficulties, Kronecker methods have been used to exploit structure in the GP covariance matrix for scalability, while ...

Full text
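The Kronecker structure referred to in the abstract above can be exploited without ever materializing the full covariance matrix. A minimal sketch (not the paper's full inference scheme, which also handles non-Gaussian likelihoods): for a product kernel on a Cartesian grid, $K = K_1 \otimes K_2$, a matrix-vector product costs $O(pq(p+q))$ instead of $O(p^2 q^2)$.

```python
import numpy as np

def kron_mvp(K1, K2, v):
    """Compute (K1 kron K2) @ v without forming the pq x pq Kronecker product.

    Uses the identity (K1 kron K2) vec(V) = vec(K1 V K2^T) with row-major
    vectorization, so cost is O(pq(p+q)) time and O(pq) memory.
    """
    p, q = K1.shape[0], K2.shape[0]
    V = v.reshape(p, q)               # fold the vector back into a grid
    return (K1 @ V @ K2.T).ravel()

# Sanity check against the dense Kronecker product on a small grid.
K1, K2 = np.random.rand(3, 3), np.random.rand(4, 4)
v = np.random.rand(12)
assert np.allclose(kron_mvp(K1, K2, v), np.kron(K1, K2) @ v)
```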

Gaussian Processes and Fast Matrix-Vector Multiplies

Gaussian processes (GPs) provide a flexible framework for probabilistic regression. The necessary computations involve standard matrix operations. There have been several attempts to accelerate these operations based on fast kernel matrix-vector multiplications. By focussing on the simplest GP computation, corresponding to test-time predictions in kernel ridge regression, we conclude that simpl...

Full text
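To illustrate the MVM-based computation the abstract above refers to, here is a minimal sketch of test-time prediction via conjugate gradients: the training kernel enters only through matrix-vector products, so any fast multiply (structured, low-rank, tree-based) can be substituted for the dense stand-in used here. The function name and arguments are illustrative assumptions, not this paper's API.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def gp_mean_mvm(kernel_mvm, y, Kstar, noise):
    """Kernel-ridge / GP posterior mean using only matrix-vector products.

    kernel_mvm(v) must return K @ v for the N x N training kernel K; K itself
    is never requested, so a fast multiply can be dropped in unchanged.
    """
    n = y.shape[0]
    A = LinearOperator((n, n), matvec=lambda v: kernel_mvm(v) + noise * v)
    alpha, info = cg(A, y)            # solve (K + noise*I) alpha = y by CG
    if info != 0:
        raise RuntimeError("CG did not converge")
    return Kstar @ alpha              # m x N cross-covariance times weights

# Dense stand-in for the fast multiply, just to show the interface.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 1))
K = np.exp(-0.5 * (X - X.T) ** 2)
y = np.sin(X[:, 0])
Kstar = np.exp(-0.5 * (np.array([[0.3]]) - X.T) ** 2)
print(gp_mean_mvm(lambda v: K @ v, y, Kstar, noise=1e-2))
```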

Transformation to approximate independence for locally stationary Gaussian processes

We provide new approximations for the likelihood of a time series under the locally stationary Gaussian process model. The likelihood approximations are valid even in cases when the evolutionary spectrum is not smooth in the rescaled time domain. We describe a broad class of models for the evolutionary spectrum for which the approximations can be computed particularly efficiently. In developing...

Full text


Journal

Journal title: IEEE Intelligent Systems

Year: 2022

ISSN: 1941-1294, 1541-1672

DOI: https://doi.org/10.1109/mis.2022.3169036